Bayesian Growing and Pruning Strategies for MAP-optimal Estimation of Gaussian Mixture Models
Author
Abstract
Real-time learning requires on-line complexity estimation. Expectation-maximisation (EM) and sampling techniques are presented that enable simultaneous estimation of the complexity and continuous parameters of Gaussian mixture models (GMMs), which can be used for density estimation, classification and feature extraction. The solution is a maximum a posteriori probability (MAP) estimator that is convergent for fixed data and adaptive with accruing data. Issues resolved include estimating the priors for element covariances, means and weights, and calculating the local integrated likelihood (evidence) of the solution. The EM algorithm for MAP estimation of GMM parameters is established and extended to include complexity estimation (i.e. iterative pruning). The EMS algorithm is introduced, which incorporates a sampling stage that enables iterative growth of the GMM. Early trials involving speech data indicate that the likelihood of hidden Markov speech models can be very substantially increased using this approach.
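To make the MAP-with-pruning idea concrete, the following is a minimal illustrative sketch (not the authors' EMS algorithm): EM for a one-dimensional GMM with a Dirichlet prior on the mixture weights and iterative pruning of components whose posterior weight collapses. All function names, the prior choice, and the hyperparameters are assumptions made for illustration.

```python
# Sketch only: EM for a 1-D Gaussian mixture with a MAP-style weight update
# (Dirichlet prior on the weights) and iterative pruning of weak components.
# The update w_k = (N_k + alpha - 1) / (N + K*(alpha - 1)) is the standard
# MAP estimate for Dirichlet(alpha)-distributed weights.
import numpy as np

def em_gmm_map(x, k=5, alpha=1.02, prune_tol=1e-3, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = rng.choice(x, size=k, replace=False)      # init means at data points
    var = np.full(k, np.var(x))                    # init with global variance
    w = np.full(k, 1.0 / k)                        # uniform initial weights
    for _ in range(n_iter):
        # E-step: responsibilities under the current parameters
        d = x[:, None] - mu[None, :]
        log_p = np.log(w) - 0.5 * (np.log(2 * np.pi * var) + d**2 / var)
        log_p -= log_p.max(axis=1, keepdims=True)  # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: MAP update of the weights, ML updates of means/variances
        nk = r.sum(axis=0)
        w = (nk + alpha - 1.0) / (n + k * (alpha - 1.0))
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk + 1e-9
        # Iterative pruning: discard components whose weight has collapsed
        keep = w > prune_tol
        if keep.sum() < len(w):
            mu, var, w = mu[keep], var[keep], w[keep]
            w /= w.sum()
            k = len(w)
    return w, mu, var
```

Starting deliberately over-complex (large k) and letting the prior and pruning step shrink the model mirrors the pruning direction described in the abstract; the growing (sampling) direction of the EMS algorithm is not shown here.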
Similar resources
Speech Enhancement Using Gaussian Mixture Models, Explicit Bayesian Estimation and Wiener Filtering
Gaussian Mixture Models (GMMs) of power spectral densities of speech and noise are used with explicit Bayesian estimations in Wiener filtering of noisy speech. No assumption is made on the nature or stationarity of the noise. No voice activity detection (VAD) or any other means is employed to estimate the input SNR. The GMM mean vectors are used to form sets of over-determined system of equatio...
Full text
Full text
Pruning of State-Tying Tree Using Bayesian Information Criterion with Multiple Mixtures
The use of context-dependent phonetic units together with Gaussian mixture models allows modern-day speech recognizers to build very complex and accurate acoustic models. However, because of the data-sparseness issue, some sharing of data across different triphone states is necessary. The acoustic model design is typically done in two stages, namely, designing the state-tying map and growing the numb...
Full text
Dirichlet Process Parsimonious Mixtures for Clustering
The parsimonious Gaussian mixture models, which exploit an eigenvalue decomposition of the group covariance matrices of the Gaussian mixture, have shown their success in particular in cluster analysis. Their estimation is in general performed by maximum likelihood estimation and has also been considered from a parametric Bayesian perspective. We propose new Dirichlet Process Parsimonious mixtur...
Full text
Voice Activity Detection Using Frame-Wise Model Re-estimation Method Based on Gaussian Pruning with Weight Normalization
This paper proposes a frame-wise model re-estimation method based on Gaussian pruning with weight normalization for noise robust voice activity detection (VAD). Our previous work, switching Kalman filter-based VAD, sequentially estimates a non-stationary noise Gaussian mixture model (GMM) and constructs GMMs of observed noisy speech signals by composing pre-trained silence and clean GMMs and se...
Full text